On divergences, surrogate loss functions, and decentralized detection

Authors

  • XuanLong Nguyen
  • Martin J. Wainwright
  • Michael I. Jordan
Abstract

We develop a general correspondence between a family of loss functions that act as surrogates to the 0-1 loss, and the class of Ali-Silvey or f-divergence functionals. This correspondence provides the basis for choosing and evaluating various surrogate losses frequently used in statistical learning (e.g., hinge loss, exponential loss, logistic loss); conversely, it provides a decision-theoretic framework for the choice of divergences in signal processing and quantization theory. We exploit this correspondence to characterize the statistical behavior of nonparametric decentralized hypothesis testing algorithms that operate by minimizing convex surrogate loss functions. In particular, we specify the family of loss functions that are equivalent to the 0-1 loss in the sense of producing the same quantization rules and discriminant functions.
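As a concrete illustration of the correspondence (not part of the abstract itself, but consistent with the authors' published results on this topic), an f-divergence between distributions P and Q with densities p and q takes the form

\[
D_f(P, Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) d\mu(x), \qquad f \text{ convex on } (0, \infty),\ f(1) = 0,
\]

and several standard surrogate losses are known to induce familiar divergences:

  • hinge loss φ(α) = max(0, 1 − α) ↔ variational (total variation) distance
  • exponential loss φ(α) = exp(−α) ↔ Hellinger distance
  • logistic loss φ(α) = log(1 + exp(−α)) ↔ capacitory discrimination (a Jensen–Shannon-type divergence)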

Related articles

On Information Divergence Measures, Surrogate Loss Functions and Decentralized Hypothesis Testing

We establish a general correspondence between two classes of statistical functions: Ali-Silvey distances (also known as f-divergences) and surrogate loss functions. Ali-Silvey distances play an important role in signal processing and information theory, for instance as error exponents in hypothesis testing problems. Surrogate loss functions (e.g., hinge loss, exponential loss) are the basis of ...


On distance measures, surrogate loss functions, and distributed detection

In this paper, we show the correspondence between distance measures and surrogate loss functions in the context of decentralized binary hypothesis testing. This correspondence helps explicate the use of various distance measures in signal processing and quantization theory, as well as explain the behavior of surrogate loss functions often used in machine learning and statistics. We then develop...


Divergences, surrogate loss functions and experimental design

In this paper, we provide a general theorem that establishes a correspondence between surrogate loss functions in classification and the family of f-divergences. Moreover, we provide constructive procedures for determining the f-divergence induced by a given surrogate loss, and conversely for finding all surrogate loss functions that realize a given f-divergence. Next we introduce the notion...
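A sketch of how such a constructive procedure can look (stated here under common conventions, not quoted from the paper): for a margin-based surrogate loss φ, pointwise minimization of the φ-risk over the discriminant value α induces a convex function

\[
f(u) = -\inf_{\alpha \in \mathbb{R}} \big( \varphi(-\alpha)\, u + \varphi(\alpha) \big),
\]

so that, up to sign and normalization conventions, the optimal φ-risk equals −D_f(P, Q). For example, for the exponential loss φ(α) = exp(−α), the infimum evaluates to 2√u, giving f(u) = −2√u and recovering the Hellinger affinity ∫ 2√(p q) dμ.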


Information Divergence Measures and Surrogate Loss Functions

In this extended abstract, we provide an overview of our recent work on the connection between information divergence measures and convex surrogate loss functions used in statistical machine learning. Further details can be found in the technical report [7] and conference paper [6]. The class of f-divergences, introduced independently by Csiszar [4] and Ali and Silvey [1], arises in many areas ...


Convexity, Detection, and Generalized f-divergences

The goal of the multi-class classification problem is to find a discriminant function that minimizes the expected 0-1 loss. However, minimizing the 0-1 loss directly is often computationally intractable, and practical algorithms are usually based on a convex relaxation of the 0-1 loss, say Φ, called a surrogate loss. In many applications, the covariates are either not available dire...



Journal:
  • CoRR

Volume: abs/math/0510521

Published: 2005